Evaluating a Model for Cache Conflict Miss Prediction
Authors
Abstract
Cache conflict misses can cause severe degradation in application performance. Previous research has shown that, for many scientific applications, the majority of cache misses are due to conflicts in the cache. Although cache conflicts are a major concern for application performance, it is often difficult to eliminate them completely. Eliminating conflict misses requires detailed knowledge of the cache replacement policy and of how data are allocated in memory, and this information is usually not available to the compiler. The compiler therefore has to resort to heuristics to try to minimize the occurrence of conflict misses. In this paper, we present a probabilistic method of estimating cache conflict misses for set-associative caches. We present a set of experiments evaluating the model and discuss the implications of the experimental results.
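As a rough illustration of the kind of probabilistic estimate the abstract refers to (the paper's actual model may differ), the sketch below assumes the distinct blocks referenced between two uses of a datum map uniformly and independently into the cache sets, so a reuse in an A-way LRU cache misses when at least A intervening blocks land in the reused block's set. The names `distinct_blocks`, `num_sets`, and `associativity` are illustrative.

```python
from math import comb


def conflict_miss_probability(distinct_blocks, num_sets, associativity):
    """Probability that a reuse misses in an A-way LRU set-associative cache,
    assuming the `distinct_blocks` intervening blocks map uniformly and
    independently at random into `num_sets` sets.

    The reuse survives only if fewer than `associativity` of those blocks fall
    into the same set as the reused block, so the miss probability is
    P[X >= A] with X ~ Binomial(distinct_blocks, 1/num_sets).
    """
    p = 1.0 / num_sets
    hit_prob = sum(
        comb(distinct_blocks, k) * (p ** k) * ((1 - p) ** (distinct_blocks - k))
        for k in range(associativity)
    )
    return 1.0 - hit_prob


if __name__ == "__main__":
    # Example: 64 distinct intervening blocks, 128 sets, 2-way associativity.
    print(conflict_miss_probability(64, 128, 2))
```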
Similar Resources
Expected Values for Cache Miss Rates for a Single Trace (Technical Summary)
The standard trace-driven cache simulation evaluates the miss rate of cache C on an address trace T for program P running on input data I with object-code address map M for P. We note that the measured miss rate depends significantly on the address mapping M set arbitrarily by the compiler and linker. In this paper, we remove the effect of the address-mapping on the miss rate by analyzing a sym...
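The summary describes factoring the arbitrary compiler/linker address mapping M out of the measured miss rate; the paper does this analytically. The hedged sketch below instead approximates the expected miss rate by Monte Carlo averaging over uniformly random block-to-set mappings for a direct-mapped cache, with `trace` given as a list of block IDs (all names are illustrative, not from the paper).

```python
import random


def miss_rate_direct_mapped(trace, mapping):
    """Miss rate of a direct-mapped cache on `trace` (a list of block IDs),
    where `mapping` assigns each block ID to a cache set."""
    resident = {}  # set index -> block currently held in that set
    misses = 0
    for block in trace:
        s = mapping[block]
        if resident.get(s) != block:
            misses += 1
            resident[s] = block
    return misses / len(trace)


def expected_miss_rate(trace, num_sets, trials=1000, seed=0):
    """Monte Carlo estimate of the miss rate averaged over uniformly random
    block-to-set mappings, i.e. with the particular compiler/linker placement
    factored out."""
    rng = random.Random(seed)
    blocks = sorted(set(trace))
    total = 0.0
    for _ in range(trials):
        mapping = {b: rng.randrange(num_sets) for b in blocks}
        total += miss_rate_direct_mapped(trace, mapping)
    return total / trials


if __name__ == "__main__":
    trace = [0, 1, 2, 0, 1, 2, 3, 0] * 50
    print(expected_miss_rate(trace, num_sets=4))
```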
Evaluation of the Performance of Polynomial Set Index Functions
Randomising set index functions, randomisation functions for short, can significantly reduce conflict misses in data caches by placing cache blocks in a conflict-free manner. XOR-based functions are a broad class of functions that generally exhibit few conflict misses. Topham and González claimed that the sub-class of functions based on division of polynomials over GF(2) contains those functions that...
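For intuition only: the sketch below implements one very simple member of the XOR-based class (folding the block address into the index with XOR), not the polynomial index functions analysed by Topham and González. It just shows how a randomising index spreads a pathological stride across sets where the conventional low-order-bits index collides.

```python
def xor_set_index(block_address, set_bits):
    """XOR-based randomising set index: fold the block address into `set_bits`
    bits by XOR-ing successive bit fields together, instead of taking only the
    low-order bits (the conventional modulo-power-of-two index)."""
    mask = (1 << set_bits) - 1
    index = 0
    a = block_address
    while a:
        index ^= a & mask
        a >>= set_bits
    return index


if __name__ == "__main__":
    # Strided block addresses that all collide under a plain modulo index
    # spread across sets under the XOR-folded index.
    stride, set_bits = 1 << 6, 6
    addrs = [i * stride for i in range(8)]
    print([a & ((1 << set_bits) - 1) for a in addrs])   # conventional index: all 0
    print([xor_set_index(a, set_bits) for a in addrs])  # XOR-folded index: 0..7
```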
Incorporating Pattern Prediction Technique for Energy Efficient Filter Cache Design
A filter cache is proposed at a higher level than the L1 (main) cache in the memory hierarchy and is much smaller. The typical size of a filter cache is of the order of 512 bytes. Prediction algorithms, popularly based upon the Next Fetch Prediction Table (NFPT), help make the choice between the filter cache and the main cache. In this paper we introduce a new prediction mechanism for predicting...
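As a simplified illustration of the NFPT idea (not the new mechanism this paper introduces), the sketch below keeps a table from the current fetch block to whether the next fetch hit in the small filter cache last time, and consults it to decide whether to probe the filter cache or to go straight to the main cache. The class and the toy driver are hypothetical.

```python
class FilterCachePredictor:
    """Simplified next-fetch predictor in the spirit of an NFPT: remember, per
    fetch block, whether the *next* fetch hit in the filter cache last time,
    and use that to steer the following fetch."""

    def __init__(self):
        self.table = {}        # block -> predicted filter-cache hit for the next fetch
        self.prev_block = None

    def predict(self, block):
        """Return True to probe the filter cache first, False to fetch from L1."""
        return self.table.get(block, True)

    def update(self, block, filter_hit):
        """Record the outcome so the previous block predicts this block next time."""
        if self.prev_block is not None:
            self.table[self.prev_block] = filter_hit
        self.prev_block = block


if __name__ == "__main__":
    filter_cache = set()      # toy fully-associative filter cache holding a few blocks
    FILTER_SIZE = 4
    pred = FilterCachePredictor()
    for block in [0, 1, 2, 0, 1, 2, 9, 0, 1, 2]:
        use_filter = pred.predict(block)
        hit = use_filter and block in filter_cache
        if not hit:
            if len(filter_cache) >= FILTER_SIZE:
                filter_cache.pop()        # arbitrary eviction, enough for the sketch
            filter_cache.add(block)
        pred.update(block, hit)
        print(block, "filter-hit" if hit else "filled-from-L1")
```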
Model-Driven Automatic Tiling with Cache Associativity Lattices
Traditional compiler optimization theory distinguishes three separate classes of cache miss – Cold, Conflict and Capacity. Tiling for cache is typically guided by capacity miss counts. Models of cache function have not been effectively used to guide cache tiling optimizations due to model error and expense. Instead, heuristic or empirical approaches are used to select tilings. We argue that con...
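To make the capacity-guided baseline concrete, here is a minimal sketch of the conventional heuristic: choose the largest square tile whose working set fits in a fraction of the cache, ignoring associativity and hence conflict misses, which is exactly the gap that associativity-aware models aim to close. The `occupancy` fraction and the other parameters are illustrative, not taken from the paper.

```python
def square_tile_size(cache_bytes, element_bytes, arrays_touched, occupancy=0.8):
    """Capacity-only tiling heuristic: largest square tile T x T such that the
    working set of `arrays_touched` tiles fits in a fraction `occupancy` of the
    cache.  Conflict misses are ignored entirely."""
    budget = occupancy * cache_bytes / (arrays_touched * element_bytes)
    t = int(budget ** 0.5)
    return max(t, 1)


if __name__ == "__main__":
    # 32 KiB cache, 8-byte doubles, 3 array tiles live at once (e.g. tiled matmul).
    print(square_tile_size(32 * 1024, 8, 3))
```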
Phase-Based Miss Rate Prediction Across Program Inputs
Previous work shows the possibility of predicting the cache miss rate (CMR) for all inputs of a program. However, most optimization techniques need to know more than the miss rate of the whole program. Many of them benefit from knowing the miss rate of each execution phase of a program for all inputs. In this paper, we describe a method that divides a program into phases that have a regular localit...
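A small sketch of why per-phase miss rates are the more useful quantity: given per-phase miss rates and access counts (the numbers below are hypothetical), the whole-program miss rate is just their access-weighted average, so phase-level information subsumes the program-level figure but not the other way around.

```python
def program_miss_rate(phase_miss_rates, phase_access_counts):
    """Whole-program miss rate as the access-weighted average of per-phase
    miss rates; the per-phase numbers are what phase-based prediction supplies
    and what a single whole-program CMR cannot recover."""
    total = sum(phase_access_counts)
    return sum(r * n for r, n in zip(phase_miss_rates, phase_access_counts)) / total


if __name__ == "__main__":
    # Hypothetical per-phase miss rates and access counts for one input.
    print(program_miss_rate([0.02, 0.15, 0.05], [1_000_000, 250_000, 500_000]))
```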